Revisiting Multi-Domain Machine Translation

Authors

Abstract

When building machine translation systems, one often needs to make the best out of heterogeneous sets of parallel data in training, and to robustly handle inputs from unexpected domains in testing. This multi-domain scenario has attracted a lot of recent work that falls under the general umbrella of transfer learning. In this study, we revisit multi-domain machine translation, with the aim to formulate the motivations for developing such systems and the associated expectations with respect to performance. Our experiments with a large sample of multi-domain systems show that most of these expectations are hardly met, and suggest that further work is needed to better analyze the current behaviour of such systems and to make them fully hold their promises.

Similar articles

Robust machine translation for multi-domain tasks

In this thesis, we investigate and extend the phrase-based approach to statistical machine translation. Due to improved concepts and algorithms, the quality of the generated translation hypotheses has been significantly improved in recent years. Still, the translation quality leaves a lot to be desired when going beyond traditional translation tasks, such as newswire articles, and when addressi...

A Multi-Domain Translation Model Framework for Statistical Machine Translation

While domain adaptation techniques for SMT have proven to be effective at improving translation quality, their practicality for a multi-domain environment is often limited because of the computational and human costs of developing and maintaining multiple systems adapted to different domains. We present an architecture that delays the computation of translation model features until decoding, al...

Revisiting Pivot Language Approach for Machine Translation

This paper revisits the pivot language approach for machine translation. First, we investigate three different methods for pivot translation. Then we employ a hybrid method combining RBMT and SMT systems to fill up the data gap for pivot translation, where the source-pivot and pivot-target corpora are independent. Experimental results on spoken language translation show that this hybrid method s...

Multi-Domain Neural Machine Translation through Unsupervised Adaptation

We investigate the application of Neural Machine Translation (NMT) under the following three conditions posed by real-world application scenarios. First, we operate with an input stream of sentences coming from many different domains and with no predefined order. Second, the sentences are presented without domain information. Third, the input stream should be processed by a single generic NMT mo...

Neural Machine Translation Training in a Multi-Domain Scenario

In this paper, we explore alternative ways to train a neural machine translation system in a multi-domain scenario. We investigate data concatenation (with fine-tuning), model stacking (multi-level fine-tuning), data selection, and weighted ensemble. We evaluate these methods based on three criteria: i) translation quality, ii) training time, and iii) robustness towards out-of-domain tests. Our ...
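
To make two of the strategies named in that abstract concrete, the sketch below illustrates data concatenation followed by fine-tuning, and model stacking (multi-level fine-tuning), purely as training schedules. It is a minimal illustration under assumptions, not the paper's implementation: the ToyTranslator model, the make_batches corpus stand-in, the PyTorch setup, and all hyperparameters are invented for the example.

```python
# Minimal sketch (assumed setup, not the paper's code) of two multi-domain
# training schedules: (1) concatenation + fine-tuning, (2) model stacking.
import torch
from torch import nn

def make_batches(n_batches: int, vocab: int = 100, seq_len: int = 8):
    """Hypothetical stand-in for a parallel corpus: random (source, target) id tensors."""
    return [
        (torch.randint(0, vocab, (16, seq_len)),   # source token ids
         torch.randint(0, vocab, (16, seq_len)))   # target token ids
        for _ in range(n_batches)
    ]

class ToyTranslator(nn.Module):
    """Placeholder for a real NMT model: maps source tokens to target-vocabulary logits."""
    def __init__(self, vocab: int = 100, dim: int = 32):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.out = nn.Linear(dim, vocab)

    def forward(self, src):
        return self.out(self.emb(src))             # (batch, seq, vocab)

def train(model, batches, lr):
    """One pass over `batches` with token-level cross-entropy; returns the updated model."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for src, tgt in batches:
        opt.zero_grad()
        logits = model(src)
        loss = loss_fn(logits.flatten(0, 1), tgt.flatten())
        loss.backward()
        opt.step()
    return model

# Strategy 1: concatenate all domains, then fine-tune on the target domain.
general = make_batches(20)     # stands in for mixed out-of-domain data
in_domain = make_batches(5)    # stands in for the domain targeted at test time
concat = train(ToyTranslator(), general + in_domain, lr=1e-3)
concat = train(concat, in_domain, lr=1e-4)          # fine-tuning pass

# Strategy 2: model stacking (multi-level fine-tuning), from broad to narrow data.
stacked = ToyTranslator()
for corpus, lr in [(general, 1e-3), (make_batches(10), 5e-4), (in_domain, 1e-4)]:
    stacked = train(stacked, corpus, lr)            # each level resumes the previous model
```

The point the sketch tries to convey is that both strategies leave the model architecture untouched: adaptation is expressed only through the order (and learning rate) in which the corpora are visited.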

Journal

Journal title: Transactions of the Association for Computational Linguistics

Year: 2021

ISSN: 2307-387X

DOI: https://doi.org/10.1162/tacl_a_00351